Online Rank-Revealing Block-Term Tensor Decomposition


Abstract

The so-called block-term decomposition (BTD) tensor model, especially in its rank-(Lr,Lr,1) version, has recently been receiving increasing attention due to its enhanced ability to represent systems and signals that are composed of block components of rank higher than one, a scenario encountered in numerous and diverse applications. Its uniqueness and approximation properties have thus been thoroughly studied. The challenging problem of estimating the BTD model structure, namely the number of block terms (rank) and their individual (block) ranks, is of crucial importance in practice and has only recently started to attract significant attention. In data-streaming scenarios and/or big data applications, where the data size grows in time or the processing can only be done incrementally, it is essential to be able to perform the model selection and computation in a recursive (online/incremental) manner. In this paper, a novel approach to this tracking problem is proposed, based on the idea of imposing column sparsity jointly on the factors and estimating the ranks as the numbers of factor columns of non-negligible magnitude. In this vein, using a new rank-revealing batch algorithm as a starting point, an online method of the alternating reweighted least squares (RLS) type is developed and shown to be computationally efficient and fast converging, while also allowing the ranks to change in time. Its time and memory efficiency are evaluated and favorably compared with those of a previously reported scheme that assumes a-priori knowledge of the ranks. Simulation results with both synthetic and real (video) data are reported, which demonstrate the effectiveness of the proposed scheme in selecting the correct model.
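As a concrete illustration of the rank-(Lr,Lr,1) BTD model and the rank-revealing idea sketched in the abstract, the following NumPy snippet builds such a tensor and reads the number of block terms and the total block rank off the numerical ranks of its unfoldings. This is only an illustrative construction under assumed random factors and tolerances; it is not the paper's RLS-type online algorithm.

```python
import numpy as np

# Illustrative sketch (not the paper's algorithm): build a
# rank-(L_r, L_r, 1) BTD tensor T = sum_r (A_r B_r^T) outer c_r
# and recover the model-structure parameters from unfolding ranks.

rng = np.random.default_rng(0)
I, J, K = 8, 9, 10
L = [2, 3]                       # assumed true block ranks, R = 2 terms

T = np.zeros((I, J, K))
for Lr in L:
    A = rng.standard_normal((I, Lr))   # I x L_r factor
    B = rng.standard_normal((J, Lr))   # J x L_r factor
    c = rng.standard_normal(K)         # rank-1 third-mode factor
    T += np.einsum('il,jl,k->ijk', A, B, c)

def numerical_rank(M, tol=1e-8):
    # count singular values above a relative tolerance (hypothetical choice)
    sv = np.linalg.svd(M, compute_uv=False)
    return int(np.sum(sv > tol * sv[0]))

# Generically, the mode-3 unfolding has rank R (number of block terms)
# and the mode-1 unfolding has rank sum_r L_r.
R_est = numerical_rank(T.transpose(2, 0, 1).reshape(K, I * J))
sumL_est = numerical_rank(T.reshape(I, J * K))
print(R_est, sumL_est)   # 2 5
```

The online method in the paper instead overestimates the ranks and drives superfluous factor columns toward zero via reweighted least squares; the counting step above shows only why column/direction counts reveal the model structure.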


Related articles

Block-Decoupling Multivariate Polynomials Using the Tensor Block-Term Decomposition

We present a tensor-based method to decompose a given set of multivariate functions into linear combinations of a set of multivariate functions of linear forms of the input variables. The method proceeds by forming a three-way array (tensor) by stacking Jacobian matrix evaluations of the function behind each other. It is shown that a block-term decomposition of this tensor provides the necessary...
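The Jacobian-stacking step this snippet describes can be sketched as follows; the function, its analytic Jacobian, and the evaluation points are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Illustrative sketch: stack Jacobian evaluations of a vector-valued
# function into a three-way array (the paper then applies a block-term
# decomposition to this tensor; that step is omitted here).

def f(x):
    # two multivariate functions of two variables (hypothetical example)
    return np.array([x[0]**2 + x[1], x[0] * x[1]])

def jac(x):
    # analytic Jacobian of f at x: 2 outputs x 2 inputs
    return np.array([[2 * x[0], 1.0],
                     [x[1],     x[0]]])

points = [np.array([1.0, 2.0]),
          np.array([0.5, -1.0]),
          np.array([2.0, 0.0])]

# Three-way array: outputs x inputs x evaluation points
J = np.stack([jac(x) for x in points], axis=2)
print(J.shape)   # (2, 2, 3)
```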


Coupled rank-(Lm, Ln, ·) block term decomposition by coupled block simultaneous generalized Schur decomposition

Coupled decompositions of multiple tensors are fundamental tools for multi-set data fusion. In this paper, we introduce a coupled version of the rank-(Lm, Ln, ·) block term decomposition (BTD), applicable to joint independent subspace analysis. We propose two algorithms for its computation based on a coupled block simultaneous generalized Schur decomposition scheme. Numerical results are given ...


Learning Compact Recurrent Neural Networks with Block-Term Tensor Decomposition

Recurrent Neural Networks (RNNs) are powerful sequence modeling tools. However, when dealing with high-dimensional inputs, the training of RNNs becomes computationally expensive due to the large number of model parameters. This hinders RNNs from solving many important computer vision tasks, such as Action Recognition in Videos and Image Captioning. To overcome this problem, we propose a compact a...


Direction of Arrival and the Rank-Revealing URV Decomposition

In many practical direction-of-arrival (DOA) problems the number of sources and their directions from an antenna array do not remain stationary. Hence a practical DOA algorithm must be able to track changes with a minimal number of snapshots. In this paper we describe DOA algorithms, based on a new decomposition, that are not expensive to compute or difficult to update. The algorithms are compare...
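The core idea in this snippet, that the numerical rank of the snapshot matrix reveals the number of sources, can be sketched as follows. For simplicity this sketch uses an SVD rather than the URV decomposition (URV plays the same rank-revealing role but is cheaper to update); all dimensions and the noise level are hypothetical.

```python
import numpy as np

# Illustrative sketch: estimate the number of sources from the
# numerical rank of a (noisy) low-rank snapshot matrix.

rng = np.random.default_rng(1)
n_sensors, n_snapshots, n_sources = 6, 40, 2

steering = rng.standard_normal((n_sensors, n_sources))   # array response
signals = rng.standard_normal((n_sources, n_snapshots))  # source signals
noise = 1e-6 * rng.standard_normal((n_sensors, n_snapshots))
X = steering @ signals + noise                           # snapshot matrix

# Count singular values above a relative threshold (hypothetical choice).
sv = np.linalg.svd(X, compute_uv=False)
est_sources = int(np.sum(sv > 1e-3 * sv[0]))
print(est_sources)   # 2
```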


Tensor rank-one decomposition of probability tables

We propose a new additive decomposition of probability tables: tensor rank-one decomposition. The basic idea is to decompose a probability table into a series of tables whose sum is equal to the original table. Each table in the series has the same domain as the original table but can be expressed as a product of one-dimensional tables. Entries in tables ...
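The decomposition this snippet describes can be illustrated with a small two-dimensional table: each term is a product of one-dimensional tables (an outer product), and the terms sum to a valid probability table. The tables and weights below are made-up numbers, chosen only for the illustration.

```python
import numpy as np

# Illustrative sketch: a 2-D probability table written as a weighted
# sum of rank-one tables, each a product of one-dimensional tables.

p = np.array([0.6, 0.4])         # one-dimensional table over X
q = np.array([0.3, 0.5, 0.2])    # one-dimensional table over Y
r = np.array([0.1, 0.9])         # second one-dimensional table over X
s = np.array([0.2, 0.2, 0.6])    # second one-dimensional table over Y

# Two rank-one terms, convexly weighted so the sum stays a distribution.
P = 0.5 * np.outer(p, q) + 0.5 * np.outer(r, s)

print(P.shape)                   # (2, 3)
print(np.isclose(P.sum(), 1.0))  # True: entries still sum to 1
```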



Journal

Journal: Signal Processing

Year: 2023

ISSN: 0165-1684, 1872-7557

DOI: https://doi.org/10.1016/j.sigpro.2023.109126